Meehai's blog
On the value of things
2021-08-27

This is a blog post about a concept I've been dwelling on for some time: things, as we humans define them, and their value.

Here is how I define a thing: an object or idea that has some orderliness, as opposed to being just random information. I believe this capacity for ordering is the main differentiator between humans and all other beings, be they animals, plants, or artificial creatures such as AI bots. There is no proof that extraterrestrial creatures with the same capacity as us (or greater) to order information and make value out of it don't exist, but on Earth, we are the only ones capable of this.

Another way of thinking about this is as a low-entropy construct. This video is a great and simple explainer of the concept: link . For example, suppose we have two arrays of N items of 1 byte each ([0-255]), say $$A=[1,4,133,2,22]$$ and $$B=[1,4,9,16,25]$$. The idea of a low-entropy construct (or of valuable information) is that there is hidden meaning behind it. To encode the first array, since the pattern is random (or very hard to identify), we can assume we'd need 5 bytes of information. For the second array, we could also use 5 bytes, or we could try to understand the underlying pattern.
In this case, we can observe that the pattern is the first square numbers in order. Thus, we could add the concept of squaring numbers to a dictionary. If we unsquare the numbers, we are left with $$B'=[1,2,3,4,5]$$. We can follow the same procedure and add the concept of the first N numbers to the dictionary. So, instead of spending 5 bytes coding everything literally, we can build a dictionary of concepts and just reference it. In this case, we could use only 4 bytes of information, by coding $$B=[1,4,9,16,25]$$ as $$\text{firstNNumbers}(5) \circledast \text{squaringNumbers}$$. The 4 bytes are: the concept of the first N numbers, setting the parameter to N=5, the application operator ($$\circledast$$, which applies the next concept to all elements of the array), and finally the squaring-numbers concept.
This is an example of using simple operations and a stored memory of concepts to encode data more efficiently. Of course, there may be even more efficient ways of coding this particular sequence, but even so, we managed to use less data (4 bytes) for the same information by adding logic (or value) to it.
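The dictionary-of-concepts encoding above can be sketched in a few lines of Python. This is my own toy illustration (the names `first_n`, `square`, and the 4-tuple "program" are assumptions for the sake of the example, not a real codec):

```python
# Toy sketch of encoding B = [1, 4, 9, 16, 25] as concepts instead of raw bytes.

def first_n(n):
    """Concept: the first n natural numbers."""
    return list(range(1, n + 1))

def square(xs):
    """Concept: apply squaring to every element."""
    return [x * x for x in xs]

# Raw encoding: one byte per element -> 5 bytes.
B = [1, 4, 9, 16, 25]
raw_cost = len(B)

# Concept encoding: (concept id, parameter, operator id, concept id)
# -> 4 bytes, assuming each symbol fits in one byte.
encoded = ("first_n", 5, "apply", "square")
concept_cost = len(encoded)

# Decoding the 4-symbol program reproduces the original array.
decoded = square(first_n(5))
assert decoded == B
print(raw_cost, concept_cost)  # prints: 5 4
```

The saving is tiny here, but the same trick scales: the longer the structured array, the more the one-time cost of the dictionary pays off.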

I believe there are two ways to measure the value of information: intrinsic value and societal value. Let's talk about each of them.

Intrinsic value
This value is related to the degree of compression of the raw information, as described earlier in the entropy example. Basically, a piece of information is more valuable the more it compresses the raw data. Sorting a partially sorted array is much easier than sorting a completely random one. Moving to more concrete examples, a picture of a person "says more" (transfers more information) than a picture of random colors. From an economic perspective, we can think of the intrinsic value as the amount of usefulness of the thing (object or idea) as an asset. An important property is that it does not vary in time: a book had the same intrinsic value in 1800 as it has in the present and will have 200 years from now. However, I do not speak here about the book as an object, but rather about the ideas conveyed in it.
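The link between orderliness and compressibility is easy to demonstrate with an off-the-shelf compressor. A minimal sketch, assuming we accept zlib's compressed size as a rough proxy for entropy (the pattern and sizes here are arbitrary choices of mine):

```python
import random
import zlib

# Structured data (a repeating byte pattern) compresses far better than
# random bytes of the same length, mirroring the low-entropy idea above.
random.seed(0)

n = 10_000
structured = bytes(i % 16 for i in range(n))            # orderly pattern
noise = bytes(random.randrange(256) for _ in range(n))  # no pattern

c_structured = len(zlib.compress(structured))
c_noise = len(zlib.compress(noise))

print(c_structured, c_noise)  # the structured data shrinks dramatically
assert c_structured < c_noise
```

The structured string collapses to a few dozen bytes, while the random one barely shrinks at all: the compressor can only exploit meaning that is actually there.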

Societal value
This value is related to how the thing intersects with the real world it lives in at a particular point in time. The main difference is that the context of the world is the principal factor influencing this quantity. Because societies are constantly evolving, the societal value of a thing or idea varies in time. From an economic perspective, the amount of usefulness of a thing is defined by society's need for it. For example, a hammer has different degrees of value in 1600 BC (high, since hammers are essential for survival and for small constructions), in 1600 AD (still high), and in 2000 (lower, since electrical tools have been created that can do the same job with less effort).
The same principle can be applied to ideas as well. For example, the idea that humans are free-willed creatures who can create their own personal values and ideas was probably very dangerous in eras when religion was the main source of societal values. However, as societies became more and more liberal, this particular idea became more and more important in defining how a person should live their life (an idea taken from Nietzsche's 'God is dead' passage).
In general, the important things that stood the test of time throughout history are the ones with high intrinsic value (since that value is constant). However, intrinsic value is limited/bounded, and the speculative nature of societies shouldn't be ignored. Another popular example is the Mona Lisa. Compared to other paintings in the Louvre, it is probably less intrinsically valuable, yet it has a huge societal value, which makes it very valuable overall. This wasn't necessarily the case when it was painted, but it gained traction that now cannot be ignored.
Businesses, in general, are about identifying societal values rather than intrinsic ones (which are also harder to identify). Markets are built around intrinsic values, but profits are still societal. One can have a great idea and simply miss the timing (society not being ready, or not having enough money for promotion).

It's clear that both types of value are important. Societal values exist only because we, as humans, evolved to the point of being able to comprehend the difference between something useful and something with little use. However, since intrinsic value is timeless, scientists should pursue ideas that go beyond local trends if they want to create new markets and make breakthroughs.